Chebyshev’s inequality

Let $X$ be a random variable with expectation $\mathbb{E}[X]$ and variance $\sigma^2 = \mathrm{Var}[X]$. Then for any $k > 0$,

$$\mathrm{Pr}[\,|X - \mathbb{E}[X]| \geq k\sigma\,] \leq \frac{1}{k^2}.$$

Here $\sigma = \sqrt{\mathrm{Var}[X]}$ is the standard deviation of $X$; intuitively, the bound is tighter when $\sigma$ is smaller.
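The bound can be checked empirically by simulation. The sketch below (distribution and sample size chosen arbitrarily for illustration) draws from an exponential distribution with rate $1$, so $\mathbb{E}[X] = 1$ and $\sigma = 1$, and compares the empirical tail probability against $1/k^2$:

```python
import random
import statistics

random.seed(0)

# Sample an exponential(1) random variable: E[X] = 1, Var[X] = 1.
n = 100_000
xs = [random.expovariate(1.0) for _ in range(n)]

mu = statistics.fmean(xs)      # empirical mean, close to 1
sigma = statistics.pstdev(xs)  # empirical standard deviation, close to 1

for k in (1.5, 2.0, 3.0):
    # Empirical tail probability Pr[|X - E[X]| >= k*sigma].
    tail = sum(abs(x - mu) >= k * sigma for x in xs) / n
    bound = 1 / k**2
    print(f"k={k}: empirical tail {tail:.4f} <= Chebyshev bound {bound:.4f}")
```

Note how loose the bound is here: the exponential tail decays much faster than $1/k^2$, which is expected since Chebyshev's inequality holds for *every* distribution with finite variance.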

Proof

This can be derived from Markov's inequality; in fact, Chebyshev's inequality can be viewed as a corollary (or special case) of it.

Apply Markov's inequality to the non-negative random variable $S = (X - \mathbb{E}[X])^2$, noting that $|X - \mathbb{E}[X]| \geq k\sigma$ holds if and only if $(X - \mathbb{E}[X])^2 \geq k^2\sigma^2$, so the two events have the same probability:

$$\mathrm{Pr}[(X - \mathbb{E}[X])^2 \geq k^2\sigma^2] \leq \frac{\mathbb{E}[(X - \mathbb{E}[X])^2]}{k^2\sigma^2} = \frac{\sigma^2}{k^2\sigma^2} = \frac{1}{k^2}.$$
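The reduction in the proof can also be sketched numerically: apply Markov's bound $\mathrm{Pr}[S \geq t] \leq \mathbb{E}[S]/t$ to $S = (X - \mathbb{E}[X])^2$ with $t = k^2\sigma^2$, and observe that the bound simplifies to exactly $1/k^2$. The distribution below (uniform on $[0,1]$) is an arbitrary choice for illustration:

```python
import random
import statistics

random.seed(1)

# Chebyshev is Markov applied to the non-negative variable S = (X - E[X])^2.
n = 100_000
xs = [random.uniform(0.0, 1.0) for _ in range(n)]  # E[X] = 1/2, Var[X] = 1/12

mu = statistics.fmean(xs)
var = statistics.pvariance(xs)       # = E[S], by definition of variance
s = [(x - mu) ** 2 for x in xs]

k = 2.0
threshold = k**2 * var               # the event S >= k^2 * sigma^2
# Markov: Pr[S >= t] <= E[S]/t; here E[S]/t = var/(k^2*var) = 1/k^2.
markov_bound = statistics.fmean(s) / threshold
tail = sum(v >= threshold for v in s) / n

print(f"Pr[S >= k^2 sigma^2] = {tail:.4f} <= E[S]/(k^2 sigma^2) = {markov_bound:.4f}")
```

Since $\mathbb{E}[S]$ *is* the variance, the Markov bound collapses to $1/k^2$ regardless of the distribution, which is the content of the proof.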


Chebyshev's inequality is an example of a concentration inequality.


See also: multidimensional Chebyshev's inequality